Method and apparatus for providing application interface parts on peripheral computer devices.
Patent abstract:
METHOD AND APPARATUS TO PROVIDE APPLICATION INTERFACE PARTS ON PERIPHERAL COMPUTER DEVICES.
The methods and devices allow parts of a display image generated on a first computing device to be displayed on a second computing device. A master auxiliary application on the first device receives selections of user content and computes a bounding frame around each selection. The master auxiliary application can expand the system's frame store to retain the selected content and cause the window manager to direct applications to paint content into the expanded frame store. The master auxiliary application can call a slave auxiliary application on the second device to receive the contents of the frame store. The slave auxiliary application stores the received display data in a frame store so that the image is displayed. Resizing, blending and rendering of display content parts can be done either on the first or second device or on a third proxy device. Keystrokes on the second device can be translated into commands executed on the first device.
Publication number: BR112012005662A2
Application number: R112012005662-0
Filing date: 2010-09-14
Publication date: 2020-09-15
Inventors: Ronen Stern; Joel Linsky; Kurt W. Abrahamson; Babak Foruntanpour
Applicant: Qualcomm Incorporated
IPC main classification:
Patent description:
"METHOD AND APPARATUS TO PROVIDE APPLICATION INTERFACE PARTS ON PERIPHERAL COMPUTER DEVICES"
Field of the Invention
The present invention relates in general to graphical user interfaces and, more specifically, to methods and apparatus for providing application interface parts on peripheral computer devices.
Background
Computing devices with graphical user interfaces, such as computer workstations and cell phones, provide users with applications that have a graphical interface. Such a graphical interface allows images to be displayed by applications and Internet web pages. However, current applications can only display images on displays attached to the computer on which the application is running.
Summary
The various aspects provide a method for displaying selected parts of a display image generated on a first computing device that implements a master auxiliary application on the display of a second computing device that implements a slave auxiliary application. The method includes reformatting a display image generated by an application that runs on the first computing device in order to fit the display of the second computing device, storing the reformatted display image in a frame store of the first computing device as a hidden window object under the guidance of the master auxiliary application, transmitting the display data of the hidden window object to the second computing device through communication between the master auxiliary application and the slave auxiliary application, storing the display data of the hidden window object in a frame store of the second computing device under the guidance of the slave auxiliary application, and rendering the display on the second computing device using the display data of the hidden window object stored in the frame store of the second computing device. Methods can include reformatting a display image by directing an application that runs on the first computing device to paint a portion of the application's display image into the frame store of the first computing device as a hidden window object and reformatting the display data of the hidden window object to fit the display of the second computing device. Method aspects may include receiving a user input on the first computing device that indicates a selection of the display image to be displayed on the second computing device and reformatting the selected parts for display on the second computing device. Reformatting the display data of the hidden window object to fit the display of the second computing device can be performed on the first computing device, and the transmission of the display data of the hidden window object to the second computing device can include transmitting the display data of the resized hidden window object to the second computing device. Alternatively, reformatting the display data of the hidden window object to fit the display of the second computing device can be performed on the second computing device. In another aspect, the methods may include transmitting the display data of the hidden window object to a third computing device, reformatting the display data of the hidden window object to fit the display of the second computing device on the third computing device, and transmitting the resized display data of the hidden window object from the third computing device to the second computing device.
Reformatting the display data of the hidden window object may include processing the display data of the hidden window object so that the data generates the display image compatible with the video of the second computing device. In another method, the first computing device can receive display data from the second computing device and reformat the display data of the hidden window object to generate a single merged display image or a juxtaposed display compatible with the second video. computing device. The transmission of the display data can be carried out via a wireless data link established between the first and second computing devices, such as a link with Bluetoothoe wireless data. Another method may include receiving a user input on the second computing device, communicating information regarding the user input received to the first computing device, correlating communication information regarding the user input received with the display image portion of the application. in order to determine a corresponding user entry for the application running on the first computing device and to communicate the user entry corresponding to the application running on the first computing device. . > Another method may include notifying the second computing device that parts of a display image can be transmitted to it, prompting the user of the second computing device to confirm their agreement to receive the display image portion, determining if the user of the second computing device has confirmed the agreement to receive the part of the display image and receive the display data of the hidden window object on the second computing device if it is determined that the user of the second computing device has confirmed the agreement on receive the part of the display image. Another method may include providing video characteristics of the second computing device to the application that runs on the first computing device and receiving a display image of the application in the frame store in a format compatible with the video of the second computing device. In this regard, the image can be —sized to a video that is larger than the video on the first computing device. Another method may include transmitting the display data of the hidden window object from the second computing device to a third computing device, storing the display data of the hidden window object received in a frame store of the third computing device and rendering a display on the third computing device using the display data of the hidden window object stored in the frame store of the third computing device. Another aspect includes a computing device configured to implement the various methods described above. Another aspect includes a communication system that includes several communication devices configured to implement the various methods described above as a system. In one respect, a programmable processor in each computing device is configured with executable instructions per processor to execute the processes of the preceding methods. In another aspect, computing devices comprise devices for carrying out the processes of the preceding methods. Several aspects also include a computer program product that includes a computer-readable storage medium in which instructions for executing the processes of the preceding methods are stored. 
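The sequence summarized above (paint into a hidden window object, reformat, transmit, store, render) can be illustrated with a short sketch. This is a minimal illustration and not language from the patent: the function names are hypothetical, the display image is modeled as a 2D list of pixel values, and a simple queue stands in for the wireless data link between the two devices.

```python
# Minimal end-to-end sketch of the summarized flow; all names are illustrative.
from collections import deque

def app():
    # Stand-in for the application's normal rendering on the first device.
    return [[(x + y) % 256 for x in range(320)] for y in range(240)]

def paint_hidden_window(render, region):
    """Step 1: paint the selected portion of the display image into a hidden window object."""
    x, y, w, h = region
    full_image = render()
    return [row[x:x + w] for row in full_image[y:y + h]]

def reformat_for_slave(hidden_window, out_w, out_h):
    """Step 2: reformat the display data to fit the second device's screen
    (nearest-neighbour scaling, chosen only for brevity)."""
    in_h, in_w = len(hidden_window), len(hidden_window[0])
    return [[hidden_window[r * in_h // out_h][c * in_w // out_w] for c in range(out_w)]
            for r in range(out_h)]

# Step 3: the master auxiliary application transmits the data to the slave auxiliary
# application (a deque stands in for the wireless data link).
link = deque()
link.append(reformat_for_slave(paint_hidden_window(app, (40, 30, 160, 120)), 96, 64))

# Step 4: the slave auxiliary application stores the data in the second device's frame
# store, from which the display is rendered.
slave_frame_store = link.popleft()
print(len(slave_frame_store), "rows x", len(slave_frame_store[0]), "columns")
```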
Brief Description of the Drawings The accompanying drawings, which are incorporated herein and form part of this report, illustrate exemplary aspects of the invention and, together with the general description presented above and the detailed description presented below, serve to explain the features of the invention. Figure 1 is a block diagram of a communication system suitable for use with the various aspects. Figure 2A is an exemplary application video presented on a mobile device. Figure 2B is an example of video shown on a wristwatch device that includes parts of the application video shown in Figure 2A. Figure 3A is an example of a web page in a web browser screen image. Figure 3B is an example of a video shown on a digital picture frame device that includes «, A part of the web page view shown in Figure 3A. Figure 4 is a block diagram of software components according to one aspect. Figure 5 is a block diagram of software components according to another aspect. Figure 6 is a block diagram of software components according to another aspect. Figure 7 is a block diagram of software components according to another aspect. Figure 8 is a process flow diagram of a method for transferring display mixes to a peripheral device according to an aspect. Figure 9 shows the interaction of a user interface with a mobile device that has a video with a touch screen according to an aspect. Figure 10 is a process flow diagram of a method that port parts of an application video to a peripheral device according to one aspect. Figure 11 is a process flow diagram of a method that portions parts of an application video to a peripheral device according to another aspect. Figure 12 is a process flow diagram of a method that portions parts of an application video to a peripheral device according to one aspect. Figure 13 is a block diagram of software components according to another aspect. Figure 14 is a process flow diagram of a method that portions parts of an application video to a peripheral device according to one aspect. Figure 15 is a block diagram of software components according to another aspect. «, Figure 16 is a block diagram of components of a mobile device suitable for use with the various aspects. Figure 17 is a circuit block diagram of an exemplary computer suitable for use with the various aspects. Figure 18 is a block diagram of components of a peripheral wristwatch device suitable for use with the various aspects. Detailed Description The various aspects will be described in detail with reference to the attached drawings. Whenever possible, the same reference numbers will be used in all drawings to refer to the same or similar parts. References made to specific examples and implementations are for the purpose of illustration and are not intended to limit the scope of the invention or the claims. In this description, the term “exemplary” is used to mean “that serves as an example, occurrence or illustration”. Any implementation described here as "exemplary" should not necessarily be interpreted as preferred or advantageous compared to other implementations. As used herein, the term “mobile device” is intended to cover any form of programmable computing device that may exist or will be developed in the future, that implements a programmable processor and video, including, for example, cell phones, video assistants. 
personal data (PDAS), palm-top computers, laptop and notebook computers, wireless e-mail receivers (the Blackberryd and Treod devices, for example), multimedia Internet enabled cell phones (the Blackberry Storm6, for example) and electronic devices personal ', similar, which include a wireless communication module, a processor and a memory. the different aspects provide methods and | devices for displaying selected parts of an image generated by an application that runs on a first computing device to be displayed on a display window of a second computing device, which is also referred to herein as a peripheral computing device. For ease of reference, the first computing device that generates a display image is referred to as a "master device", while the The second computing device or peripheral computing device that receives and displays the image is referred to as a "slave device". The various aspects can use specialized applications to help in sharing and communicating display stores of the master and slave devices. For ease of reference, such specialized applications are referred to here as "auxiliary applications". A master helper application can be implemented on the master device to help prepare images and display stores to communicate display data to the slave device, and a slave auxiliary application can be implemented on the slave device to help receive display stores and render the related images. The auxiliary master application that runs on the master device that has privileged access to the low-level subsystem of the master device is included within the operating system. This master helper application allows the user to initiate a display share processed by providing a user input, such as a hot key or mouse click, on the '. master device. The master helper application allows the user to select one or more regions of content displayed on the master device for sharing on a slave device. If the master device has a video with a touchscreen, the user can select regions of the content to share on the server device using a special gesture. The master helper application can allow the user to select multiple regions of the displayed content. The master helper application can compute bounding frames in each of the selected content regions. The master device can discover slave devices that are within communication with the master device, through a Bluetoothê communication link, for example, and allow the user to select a specific slave device to receive the selected content regions for display. Once the slave device has been identified, the auxiliary master application can expand the device's system frame store enough to conserve the identified regions of the content. The master helper application can ask the window manager for the application displaying content within the bounding frame and ask the window manager to direct that application to remove all of its content and place it in the newly allocated frame store. A prompt can be shown to the user to indicate whether the application should still insert primary content “into the store” for display on the master device. The window manager can copy the transmitted view from the application to either or both of the primary and newly allocated frame stores. The master auxiliary application makes a connection to a slave device and calls the slave auxiliary application that runs on the slave device to communicate the selected content regions. 
The user can be given the option of displaying the selected content regions on the slave device in one of three ways: take over the entire display; superimpose the selected content regions on the current display content of the slave device (with a slider to define the level of transparency); or fit both contents on the same screen. The master device can consult the slave device about its display and processing capabilities to determine how processing is to be performed. In some implementations, the slave device will have less processing power and memory than the master device, in which case the master device can be used to perform much of the image processing. In other implementations, the slave device will have more processing power and memory than the master device, in which case the master device can send the image data to the slave device for processing. The processing that is performed may depend on the display mode that is selected by the user for the slave device. In the event that the display content provided by the master device will occupy the entire display of the slave device (i.e., "take over" mode), the master auxiliary application on the master device can obtain the selected content regions from the master device's frame store, resize this content in heap memory to fit the display size of the slave device, and send the resized data to the slave auxiliary application, which accepts the data and stores it in the slave device's frame store for display. In the event that the display content provided by the master device overlaps the content of the slave device (that is, "overlay" mode), the master auxiliary application on the master device asks the slave device to supply its current frame store contents. This display information provided by the slave device is then merged with the selected content regions of the master device's display in the master device's frame store, after which the master auxiliary application sends the resulting display data to the slave auxiliary application, which places the data in the frame store of the slave device for display. In the event that the display content provided by the master device is shown on the slave device's display next to the slave device's own display content (that is, "fit both" mode) and the master device has more processing power, the master auxiliary application asks the slave device to provide its current frame store contents, which it receives and resizes to provide space for the selected content regions of the master device's display. The master auxiliary application also resizes the selected content regions of the master device's display so that both displays can fit together in a juxtaposed way within the slave device's display area. The combination of the two resized displays is then sent to the slave auxiliary application, which places the data in the slave device's frame store for display. In addition to moving a portion of a display from the master device to the slave device, the slave device can accept user input related to the displayed content, which can be passed back to the application that runs on the master device to enable user interface capability on the slave device. Keystrokes received on the slave device are provided by the slave auxiliary application to the master auxiliary application on the master device, which interprets them as input commands and passes the appropriate keystroke information, through the window manager, to the application that generates the display.
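A minimal sketch of the three display modes just described (take over, overlay, and fit both), under the assumption that display content can be treated as 2D lists of grayscale pixel values; real devices would operate on RGB frame store memory, and nearest-neighbour scaling is used here only for brevity.

```python
# Illustrative sketch of the three display modes; not code from the patent.

def scale(buf, out_w, out_h):
    """Nearest-neighbour resize, used for take-over and fit-both modes."""
    in_h, in_w = len(buf), len(buf[0])
    return [[buf[r * in_h // out_h][c * in_w // out_w] for c in range(out_w)]
            for r in range(out_h)]

def overlay(master_buf, slave_buf, alpha):
    """Overlay mode: alpha-blend master content over the slave's current display.
    Both buffers must already be the slave screen size; alpha is the user-chosen
    transparency level (0.0 = slave only, 1.0 = master only)."""
    return [[round(alpha * m + (1.0 - alpha) * s) for m, s in zip(m_row, s_row)]
            for m_row, s_row in zip(master_buf, slave_buf)]

def fit_both(master_buf, slave_buf, screen_w, screen_h):
    """Fit-both mode: resize each source to half the slave screen and juxtapose them
    side by side (the left/right ordering is arbitrary here)."""
    left = scale(slave_buf, screen_w // 2, screen_h)
    right = scale(master_buf, screen_w - screen_w // 2, screen_h)
    return [l_row + r_row for l_row, r_row in zip(left, right)]

master = [[200] * 64 for _ in range(48)]   # selected content region from the master
slave = [[50] * 96 for _ in range(64)]     # slave device's current display (96 x 64 screen)
merged = overlay(scale(master, 96, 64), slave, alpha=0.6)
side_by_side = fit_both(master, slave, 96, 64)
```

Which device runs these operations (master, slave, or a proxy) depends on the relative processing power negotiated between the devices, as described above.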
The running application can perform the appropriate processing and render the display contents in the secondary frame store as normal, which will result in a corresponding display on the slave device. In one respect, the master auxiliary application and the slave auxiliary application can run concurrently on a single computing device. This aspect allows two computing devices to work with a third computing device referred to as a "proxy device", which can be used to perform some part of the processing associated with the resizing, fitting and / or mixing of the various display contents. In one respect, such a proxy device can only be used if it has the processing power, memory and data connection speed necessary to process the display processing transaction. When a proxy device is used to perform some part of the display processing, both the master device and the slave device send the selected content to the proxy device for reprocessing. The proxy device performs the necessary display image processing and sends the processed data to the slave device for display. The different aspects can be used in different wireless and wired communications networks. As an example, Figure 1 shows a wireless communications network 10 that uses communication links with wireless data and cell phones suitable for use with the various aspects. The communications network 10 can include several computing devices, such as the mobile device 5, with a graphical user interface. The mobile device 5 can be configured with a network antenna and a transceiver to transmit and receive cellular signals 3 to / from a base site or cellular base station 14. In this exemplary network 10, the base station 14 is a part of a cellular network which includes elements needed to power the network, such as a mobile switching center (MSC) 16. In operation, the MSC 16 is capable of routing calls and messages to and from the mobile device 5 via the base station 14 when the mobile device 5 is making and receiving cellular data calls. The mobile device 5 is also capable of sending and receiving data packets through a gateway 18, which connects the cellular network to the Internet 12. The mobile device 5 can also be configured with an antenna and a transceiver to transmit and receive Personal Area Network Signals 2 capable of establishing a personal area network with other computing devices, such as a Bluetooth wireless communication link. The mobile device 5 can use such a personal area network to connect with other computing devices, such as a laptop computer 7, an electronic wristwatch with a programmable video 6 and a digital picture frame 8. Some of the computing, such as the laptop computer 7, can be configured with hardware and network connections to establish a connection to the Internet 12, such as a wireless local area network connection. The use of various aspects with computing devices in the communications network 10 can enable several useful applications. For example, users can run an application on a computing device, such as a mobile device 5 or laptop computer 7, and broadcast some or all of the application views via local area network broadcasts 2 to a display device more appropriate, such as a digital picture frame 8 or an electronic wristwatch video 6. 
As another example, the user can receive e-mail on a mobile device 5 via a cellular wireless network transmission 3 and be able to see an indication that the email was received or to view parts of the email itself in an electronic wristwatch video 6, with the display information communicated by personal area network broadcasts 2. As another example, the user can access content from a website on the Internet 12 via a wired connection (as shown for laptop computer 7), or via an extended area wireless network transmission 3 (as shown for the mobile device 5) and may elect to display at least parts of that content on a digital picture frame 8 or on an electronic wristwatch video 6, with the display information communicated by the personal area network transmissions 2. Thus, the user can access a streaming video content source on the Internet 12 through a personal computer 7 and present the video images in a digital image frame 8. As described more fully below with reference to In Figures 14 and 15, an aspect allows the display of parts of the image content generated on a first device in the video of a second device that uses the processing power of a third device. This is enabled by the communications network 10, which can allow computing devices, such as a mobile device 5, an electronic wristwatch 6 and a laptop computer 7, to exchange display data via personal area network transmissions. 2. For example, the user who receives display content on a mobile device 5 through an extended area wireless network transmission 3 is able to transfer some display wall to an electronic wristwatch 6 using a laptop computer 7 to perform some part of the image reformatting necessary to fit within the video size of the electronic wristwatch 6, with data communications between the three devices being carried by personal area network transmissions 2. The various aspects can use components that they are found on several computing devices configured with graphical user interfaces (GUI). As is well known in computing techniques, GUI environments can use several pixel arrangements to display graphics. Such arrangements can generally be referred to as storage, trackers, pixel storage, pixel maps or bit maps. The first GUI environments used a single pixel store to display the output of an application on a video (a monitor, for example). Such a pixel store can be referred to as a frame store. In a GUI environment with a single frame store, applications can copy data that corresponds to pixel color values in the frame store, and the monitor can color the screen according to the data stored in the frame store. A frame store that is accessed by a video trigger in order to update the image can be referred to as a system frame store. Pixel storages, which include system frame storages, often use various arrangements using techniques known as double storage and triple storage, but the various storage units can still be referred to as a single storage. Modern GUI environments can allow multiple graphics applications to access the same view through a concept called window view. In such an environment, the operating system can hide the system frame store from most applications. Instead of accessing the system frame store directly, each application can send its display output to a pixel store, which can be referred to as a window store. The window store can be read by the window manager, an application that is part of a GUI environment displayed in a window. 
The window manager can determine where, if somewhere, within the system frame store the contents of the window store should be stored. For example, a GUI displayed in a window may have three applications running inside windows, for example. If the window for application A is kept to a minimum, its output (that is, the contents of your window store) may not be displayed and the contents of your window store may be ignored by the window manager. If the windows for application B and application C are both active on the desktop, but the window for application B partially obstructs the window for application C (that is, if window B partially overlaps window C) , the window manager can copy the entire contents of the application B window store into the system frame store, while copying only part of the application window C store into the system frame store. In addition to displaying the various windows, a window manager can also provide applications with information about the windows. For example, a window manager can notify an application when its window is reduced to a minimum, resized or hidden from view. The window manager can also provide the window with information such as the size or location of the window. In addition, a window manager can notify an application when the user interacts with the application window (by clicking a mouse button while the mouse pointer is positioned inside the window for that application, for example). The various aspects (the various pixel stores and the various widgets) that constitute an application displayed in a window can be considered child objects of the occurrence of the application displayed in a window. Generally, a simple application such as a text editor will correspond to a single operating system process, which can include multiple streams of execution. Some more complex applications will have multiple processes that appear to the user as an application. As would be understood by those skilled in the art, processes can be connected together as parent and child processes. The previous description is just that of an exemplary method for generating views in a GUI environment displayed in a window. Many window managers, particularly non-composite window managers, do not use a window store for each window. Such window managers can explicitly ask active windows for their exit and notify obstructed windows that their exit is not necessary. In addition, windows may not store a storage for each window element. Instead, some window elements can use vector graphics or a similar method to create pixel images using an algorithm. Some window objects may not dedicate a part of the memory to store the pixel output of their various sub-components. Instead, when asked for their pixel output, such window objects will simply aggregate the pixel output of the various sub-components, which may or may not be based on a dedicated array of pixels stored in memory. Therefore, as used herein, a pixel store (a window store, a display window store or a render store, for example) means either a dedicated part of memory to store pixel values or a temporary part of memory for store pixel values that correspond to the result of a function call. Computing devices configured with GUI environments displayed in a window are not limited to desktop computers. Mobile devices often include GUI environments with a window manager. 
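The three-application example above (application A minimized, application B partially overlapping application C) can be sketched as follows; the data layout is an illustrative assumption, with each window store modeled as a 2D list of characters so that the composited system frame store can simply be printed.

```python
# Illustrative sketch of a window manager copying visible window stores into the
# system frame store in back-to-front order; not code from the patent.

SCREEN_W, SCREEN_H = 80, 24

def blank_frame_store():
    return [[" " for _ in range(SCREEN_W)] for _ in range(SCREEN_H)]

def compose(windows):
    """windows: dicts with 'minimized', 'x', 'y', 'store' (2D list), ordered back to front."""
    frame = blank_frame_store()
    for win in windows:
        if win["minimized"]:
            continue                      # minimized: its window store is ignored
        for row_idx, row in enumerate(win["store"]):
            for col_idx, pixel in enumerate(row):
                y, x = win["y"] + row_idx, win["x"] + col_idx
                if 0 <= y < SCREEN_H and 0 <= x < SCREEN_W:
                    frame[y][x] = pixel   # the front-most window wins
    return frame

windows = [
    {"minimized": False, "x": 2,  "y": 1, "store": [["C"] * 30 for _ in range(10)]},
    {"minimized": False, "x": 20, "y": 4, "store": [["B"] * 30 for _ in range(10)]},  # overlaps C
    {"minimized": True,  "x": 0,  "y": 0, "store": [["A"] * 10 for _ in range(5)]},
]
system_frame_store = compose(windows)
print("\n".join("".join(row) for row in system_frame_store))
```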
GUI environments with a window manager can be part of almost any computing device with an integrated display or a connection capable of carrying a video signal, such as an HDMI output or simply a network interface. Such devices may include electronic wristwatches, video display glasses, digital picture frames, televisions, DVD devices, set-top box converters, to name just a few. By way of illustration, a mobile device 5 and an electronic wristwatch 6 configured with GUI environments displayed in a window are shown in Figures 2A and 2B to show how a graphics application can be shared between multiple displays. In the example shown, a mobile device 5 is shown running a poker application within a GUI 20 displayed in a window in Figure 2A. This illustrative poker application includes an interface display that shows the state of the game along with virtual keys 31, 32, 33 to receive touchscreen inputs from the user for game control. The GUI 20 displayed in a window on the mobile device 5 can allow two or more applications to share the same display. Typically, systems with a GUI displayed in a window allow switching between one application display and another. For example, when the user receives an incoming voice call, the window manager can hide the poker game in order to display the graphical interface of the phone call application. However, switching between application displays may not be ideal in some situations or applications. The mobile device 5 can provide other methods to share the display between multiple applications at the same time, such as performing an alpha blend of the output of one application with another, or displaying application interfaces within the traditional movable and resizable windows familiar to desktop operating system users. However, sharing a display is not ideal for some applications. For example, if the user is watching a video on the mobile device 5 while playing the poker game shown in Figure 2A, the user may wish to view the video on the entire monitor without having to switch between the movie and the game, and without obscuring a part of the video in order to reveal information about the game. The various aspects overcome these disadvantages by allowing an application that runs on one computing device to be displayed on another computing device. Figure 2B shows an electronic wristwatch display 6 that has a GUI 40 displayed in a window to which parts of the poker game display have been transferred from the mobile device 5. The various aspects allow the user to select the parts of the poker application that are most relevant to the user, such as the parts that display their cards and money, and display those selected parts on the electronic wristwatch display 6. To generate the display image according to one aspect, the user can designate the parts of the GUI 20 displayed in a window on the mobile device 5 that must be mixed and transferred to the electronic wristwatch display 6. This is shown in Figure 2A, which shows user selection boundary frames 21-30 that highlight parts of the GUI 20 displayed in a window that should appear in the GUI 40 displayed in a window on the wristwatch display 6. For example, selection boundary frames 21-25 select the parts of the poker application that show the values of the cards on the table. Thus, in order to present a display on the electronic wristwatch 6 that shows the state and values of these cards, the user only needs to select the parts of the display within the bounding frames 21-25, eliminating the need for the poker application's values to be interpreted and transformed into a second form of display.
In addition, the user is able to select the information to be displayed; in the example shown, the user has chosen not to include the suits of the cards in the transferred display. Alternatively, the application itself can determine which parts of the main display are to be transferred to the slave device. In this regard, the application can be informed of the display capabilities of the slave device and use this information in order to define a display image that fits optimally on that display. For example, if the application is informed that the slave device has a 176 x 144 display, it can render an image suitable for this display size. This may include rendering objects differently based on the pixel resolution and color of the display, such as using simple icons on low resolution displays and complex icons on high resolution displays. Automatic resizing of display images can also include generating a larger, more extensive display image when the slave device has a larger, more capable display than the master device. For example, if the application is running on a master cell phone device with a 640 x 480 display and the image is being transferred to a 1080p high definition television, the application can render a larger, more detailed display image suitable for the television format. Figures 2A and 2B also show how the virtual keys that appear on the display of a first device can be transferred to the display of a second device. In the example shown, the user has designated a selection boundary frame 30 that encompasses the virtual keys 31, 32, 33 used to control the poker game. Consequently, the virtual keys 31, 32, 33 appear in the GUI 40 displayed in a window on the electronic wristwatch 6. As explained more fully below, the methods for transferring the images of the virtual keys to the second device allow activation of these virtual keys on the second device to be translated into the appropriate commands for the application that runs on the first device. Thus, if the user presses the "Raise" image on the wristwatch GUI 40 displayed in a window, this event can be communicated to the mobile device 5 so that it can be interpreted as a press of the "Raise" soft key 31 as if it had occurred on the mobile device itself. Figures 2A and 2B illustrate some of the advantages of the various aspects. The mobile device 5, for example, has the processing power and network access capabilities to present a poker application, including enabling online game play. However, its size may not be suitable for use in all situations, and it may be necessary to keep its display minimized during some uses of the mobile device, for example, while making a phone call. On the other hand, the electronic wristwatch display 6 is very convenient in the sense that it fits on the wrist and thus can be seen at times when the display of the mobile device 5 may not be. However, the memory and processing power of the electronic wristwatch 6 are necessarily limited by its small size. Thus, the aspects allow users to enjoy the use of an application on a convenient computing device, such as an electronic wristwatch display, that may not have enough computing power to run the application itself. In addition, allowing the user to designate the parts of the display to be presented on the second computing device allows users to easily customize an application according to their preferences. Thus, the different aspects can allow users to take advantage of the best aspects of two computing devices.
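A small sketch of the adaptive rendering idea described above, in which the application is told the target display's characteristics and chooses a layout accordingly. The thresholds, field names and layout choices are illustrative assumptions, not values taken from the patent.

```python
# Illustrative sketch: pick rendering detail based on the reported display characteristics.

def choose_layout(width, height, color_depth_bits):
    pixels = width * height
    if pixels <= 176 * 144 or color_depth_bits <= 8:
        return {"icons": "simple", "show_card_suits": False, "font_px": 8}
    if pixels >= 1920 * 1080:
        return {"icons": "detailed", "show_card_suits": True, "font_px": 28}
    return {"icons": "standard", "show_card_suits": True, "font_px": 14}

print(choose_layout(176, 144, 8))     # e.g. a wristwatch display
print(choose_layout(1920, 1080, 24))  # e.g. a 1080p television
```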
The different aspects can be used in several other ways that can have benefits for users. For example, Figures 3A and 3B show an implementation in which a portion of the desktop display that includes an image is selected and transferred for display in a digital image frame 8. Figure 3A shows a desktop display 55 from a computer workstation on which a web browser is displayed that displays a web cam image. If the user wishes to present the web cam image on another display device, such as a digital picture frame 8, the user can implement an aspect of the present in order to select a part 58 of the desktop display 55 to be transmitted to the digital picture frame 8. As shown in Figure 3B, the various aspects can allow the user to display only the desired part of the web browser display on a peripheral computing device, such as the digital picture frame 8. Computing devices capable of running a GUI displayed in a window can use a window manager to coordinate the sharing of input and output devices between applications in user space. An example of how a window manager 120 can interact with other aspects of a computer operating system 100 is shown in Figure 4, which shows software components that can be implemented on a computing device. The computing device typically uses an operating system 100 to manage various input and output devices, such as the touchscreen sensor 102, a series of buttons 104 and a monitor 106. The various input devices on a computing device they can include both hardware components for converting user inputs to electrical signals and software components, such as a device driver, that allow operating system 100 to properly send electrical signals to applications. the various output devices of a computing device may also include hardware components that physically change based on the received electrical signals, and corresponding software components, such as a device driver, that produce electrical signals based on received commands of other parts of the operating system 100. In the case of a monitor 106, your device driver may include a system frame store. Operating system 100 can allocate some of the input and output resources exclusively to a window manager 120. Operating system 100 can also have additional input and output devices that correspond to hardware and software components that are not allocated to the window manager. windows 120, such as an Internet connection 108 that corresponds to a network interface. Some applications may not require direct user interaction and will only use hardware resources not managed by the window manager 120. An application that works independently of user input can be referred to as a daemon (or daemon application) or termination and stay application as a resident (“TSR”). Operating system 100 may also include a series of application instances 132a, 132b that may require the use of monitor 106. Application instances 132a, 132b may also require periodic user input, such as buttons 104 and / or the touchscreen sensor 102. For each application instance 132a, 132b the window manager can maintain status information in the form of a window object 122a, 122b. Such state information may include the size and shape of the window corresponding to application instance 132a, 132b and an identifier that window manager 120 can use to communicate with application instance 132a, 132b. 
In one aspect, in which window manager 120 is similar to a "composition" window manager, window object 122a, 122b may include a store that stores the graphical output of application instance 132a, 132b. Some computing devices with smaller monitors may not provide the user with mobile, scalable windows that correspond to applications. A window manager 120 on such a device can simply allow the user to "switch" between application monitors. the various aspects can use a window manager 120 to display an application that runs on a master computing device and that is displayed on a slave computing device (i.e., the target application). An example of a panoramic view of how a window manager 120 can interact with various applications to execute such a display method is shown in Figure 5, which shows software components that can be implemented in master and slave computing devices. Master device 5 can be the computing device (a mobile device, for example) that hosts target application instance 134. Target application instance 134 runs on the processor and memory of master device 5 and directly uses resources master device 5, such as the Internet connection 108. master device 5 can also host another instance of application 132. master device 5 can use a window manager 120 to manage the input and output of the various application instances 132 and 134. As discussed earlier, window manager 120 can use a window object 122 to store state information regarding the various application instances 132 and 134. As described above, the various aspects can use auxiliary applications 150, 160 to coordinate the sharing and communication of display stores of the master and slave devices. As shown in Figure 5, auxiliary master application 150 can be implemented on master device 50 to help prepare display images and storage for communication to slave device 6, and slave auxiliary application 160 can be implemented on slave device 6 to help receive the display stores and render the related images. State information regarding the occurrence of target application 134 can be referred to as a hidden window object 126 while the occurrence of target application 134 is being displayed on a slave device 6. In some ways, the user may have the option of removing the occurrence of target application 134 from the worktable while being displayed on the slave device 6. In this regard, the hidden window object 126 will not be accessed by the aspect of the window manager 120 which aggregates the various windows in the system frame store. The hidden window object 126 may include a store to store the output of the target application 134. The store may be of sufficient size to store the entire output of the target application 134. Alternatively, the store may be of a size equal to the parts selected by user of the target application 134 that will be displayed on the slave device 6. The master auxiliary application 150 can access the storage of the hidden window object 12 and send the display part of the slave device 6 through a personal area network 109 , such as a Bluetooth6 connection. In some respects, the user will have the option of displaying the occurrence of target application 134 on both master device 5 and slave device 6 simultaneously. Such an aspect may not use a store within the hidden window object 126. In such a case, the auxiliary application master 150 can access the system frame store to collect the part to be displayed on the slave device 6. In different aspects, the slave device 6 can implement a window manager 121. 
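As described above, the master auxiliary application 150 can read display data out of the hidden window object's store and send the display part over the personal area network 109. A minimal sketch of how such a transfer might be framed is shown below; the length-prefixed header and the 8-bit pixel format are illustrative assumptions, and an already-connected socket object stands in for the Bluetooth® link.

```python
# Illustrative framing of a display-store transfer between the helper applications.
import struct

def send_frame(sock, width, height, pixels):
    """pixels: bytes of length width*height (e.g. 8-bit grayscale). A small header
    carries the dimensions so the slave can place the data in its frame store."""
    header = struct.pack("!HH", width, height)
    sock.sendall(header + pixels)

def recv_frame(sock):
    header = _recv_exact(sock, 4)
    width, height = struct.unpack("!HH", header)
    pixels = _recv_exact(sock, width * height)
    return width, height, pixels

def _recv_exact(sock, n):
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("link closed before the frame was complete")
        data += chunk
    return data
```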
The slave device 6 can also include a slave auxiliary application 160 to receive the display parts from master device 5 through a connection to the personal area network 109. In some aspects, the window manager 121 of the slave device 6 can display the received parts by creating a window object 122 corresponding to the slave auxiliary application 160 and displaying the window as if it were a typical window. In some aspects, the user may have the option of having the target application instance 134 take over the slave device's monitor 6 (i.e., full-screen mode). Alternatively, the user may have the option of displaying the target application instance 134 as a normal movable window on the slave device 6. As discussed above with reference to Figure 5, the various aspects can use auxiliary applications to communicate display stores between the master and slave devices. In some respects, the master and slave auxiliary applications may include sub-components that run on the master and slave devices. Examples of some sub-components that can be implemented to perform the functions of the auxiliary applications are shown in Figures 6 and 7, which show software components that can be implemented in master and slave computing devices, respectively. Referring to Figure 6, the window manager 120 of a master device 5 may include a plug-in subcomponent 151 of the master auxiliary application. The master auxiliary application plug-in 151 can provide an interface for retrieving data from a hidden window object 126, which corresponds to the target application instance 134. The master auxiliary application plug-in 151 can also provide an interface to the window manager 120 for receiving information regarding the slave device 6, which includes input events, such as a mouse event. In some aspects, the slave device 6 can provide window display data, such as the size of the display window on the slave device 6 and whether it is dirty or obstructed. Such information can be relayed to the application instance 134 by the master auxiliary application 150 through the master auxiliary application plug-in 151. The master auxiliary application 150 may also include a master auxiliary application TSR component 152 (i.e., a "terminate and stay resident" application). The master auxiliary application TSR component 152 can communicate with other devices to discover any potential slave devices 6. It can also transfer the display store of target application instance 134 to slave devices 6 by consulting the window manager 120 via the master auxiliary application plug-in 151. In some respects, the master auxiliary application TSR component 152 can transform the output of the target application instance 134 based on user preferences and the capabilities of the slave device 6. For example, the target application instance 134 can be designed to run on a mobile device that has no movable and resizable windows. Therefore, the target application instance 134 may not have the intrinsic ability to resize its output to suit a smaller monitor, such as a watch. In such a case, the hidden window 126 may include a display store equivalent to the screen size of the mobile device, and the master auxiliary application TSR component 152 can crop, resize and rotate the store before passing it to the slave device 6. The master auxiliary application 150 can also include a master auxiliary application user interface 153.
The master auxiliary application user interface 153 can provide the user with the ability to define the parts of an application to send to a slave device 6 and to define some of the specifics of the display, such as the slave device to be used, whether or not to take over the slave monitor, and the refresh rate between the master and slave devices. The master auxiliary application user interface 153 can be a graphical application with a corresponding window object 122 within the window manager 120. In order to provide the user with the appropriate options, the master auxiliary application user interface 153 can gather data about the identity and capabilities of the slave devices 6 from the master auxiliary application TSR component 152. The master auxiliary application user interface 153 can also gather information from the window manager 120 via the master auxiliary application plug-in 151, which can be used to provide the user with the ability to define the application parts. With reference to Figure 7, the slave auxiliary application 160 can be made up of several sub-components. The slave auxiliary application TSR component 162 can receive a display store from master device 5 and paint it on a corresponding window object 122. It can also send to master device 5 data received from the window manager that correspond to user input events or other window events, such as an obstruction. In addition, it can consult the window manager regarding the slave device's display capabilities via a slave auxiliary application plug-in 161. The slave auxiliary application TSR component 162 can also communicate with master devices so that they discover each other. The slave auxiliary application 160 can also include a slave auxiliary application user interface 163 to provide the user with the ability to set preferences. In some aspects, the slave auxiliary application user interface 163 will provide the user with the ability to accept or reject certain connections to prevent an unwanted or hostile application from taking over the monitor. The various components shown in Figures 6 and 7 can be classified as slave or master for a specific function. A specific computing device can be a slave in some instances and a master in others, while having only one auxiliary application plug-in, one auxiliary application TSR component, and one auxiliary application user interface. In some aspects, the slave and master capabilities can be separated into different applications. Alternatively, a computing device capable of being either a slave or a master may have a single plug-in and a single user interface, but separate TSR components. A method for establishing a display across multiple computing devices is shown in Figure 8, which shows a process 200 that can be implemented on a computing device. In blocks 202 and 203 of process 200, a master device 5 can begin executing a master auxiliary application TSR component 152 in block 202, and a slave device 6 can begin executing a slave auxiliary application TSR component 162 in block 203. In block 204, the master auxiliary application TSR component 152 can locate potential slave devices by sending a broadcast message over a network, such as on the discovery frequencies of a Bluetooth® device, and receiving a response that includes the display capabilities of the slave devices. In block 208, the master device can receive user inputs that define the parts of the application interface that will be displayed on a slave device.
For example, the user can start the process by entering a keyboard sequence (Ctrl+F13, for example), by selecting a menu option from the window menu (that is, the menu containing window control options, such as minimize and exit), or by making a specific gesture on a device with a touchscreen. The user can then define certain rectangular regions within the target application instance 134 that will be displayed on the slave device. In some aspects, the start and define processes can happen simultaneously, as discussed below with reference to Figure 9. In block 214 of process 200, the master auxiliary application user interface 153 can provide the user with a list of slave devices that are available (that is, in communication with the master device). In block 220, the master auxiliary application can receive the user's selection of a slave device and inform the slave auxiliary application of the selection. In block 222, the slave auxiliary application can cause the slave device 6 to generate a display that prompts the user to confirm acceptance of the transfer of display images from the master device 5. For example, the generated prompt can inform the user that a computing device has contacted it via a Bluetooth® connection and would like to establish a link that will take over the device's monitor. The slave auxiliary application can be configured to interpret a specific button press as indicating user confirmation of the connection. The slave auxiliary application can determine whether a user input indicates confirmation of acceptance of the display image transmission and, if so, notify the master device that it will accept image data transmissions. This confirmation process is optional and can be performed to provide protection against inadvertent or unauthorized transfer of images to a computing device. In some respects, there may be only one possible slave monitor, and blocks 214 and 220 can be executed automatically. Once the slave device has been selected and (optionally) the user has accepted the image transfer to the slave device, in block 224 the master and slave devices can negotiate the specific display mode. This negotiation process may include establishing the proportions of the display area available on the slave device, establishing the refresh rate between the devices, and determining whether and which window events will be relayed from the slave device to the master device. This negotiation may involve contemporaneous interaction with the user on one or the other of the devices or on both the master and slave devices, such as, for example, selection between different display options, and may also involve determining pre-existing user preferences on either the slave device or the master device. In block 228 of process 200, the window manager 120 of the master device 5 can establish a hidden window 126 for the target application instance 134. In some respects, target application instance 134 may already be painting on a window object 122. Window manager 120 can convert window object 122 into a hidden window object 126 by a series of processes that involve creating an additional display store. In one aspect, in the case where window manager 120 is a "compositing" window manager, a display store may already have been associated with window object 122.
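The discovery and negotiation steps of process 200 described above (blocks 204 through 224) might be sketched as follows. The nearby-device records are simulated in memory and all field names are illustrative assumptions; a real implementation would obtain this information through Bluetooth® device discovery and the user prompts described above.

```python
# Illustrative sketch of slave discovery and display-mode negotiation.

NEARBY_DEVICES = [
    {"name": "wristwatch", "width": 176, "height": 144, "max_refresh_hz": 15, "accepts_input": True},
    {"name": "photo-frame", "width": 800, "height": 600, "max_refresh_hz": 30, "accepts_input": False},
]

def discover_slaves():
    """Stand-in for broadcasting a discovery message and collecting capability responses."""
    return list(NEARBY_DEVICES)

def negotiate(master_prefs, slave_caps):
    """Agree on the display mode, target size, refresh rate and input relaying."""
    return {
        "mode": master_prefs["preferred_mode"],          # take over, overlay, or fit both
        "target_size": (slave_caps["width"], slave_caps["height"]),
        "refresh_hz": min(master_prefs["max_refresh_hz"], slave_caps["max_refresh_hz"]),
        "relay_input": slave_caps["accepts_input"],
    }

slaves = discover_slaves()                               # block 204: locate potential slaves
chosen = slaves[0]                                       # blocks 214/220: user picks from the list
session = negotiate({"preferred_mode": "overlay", "max_refresh_hz": 10}, chosen)  # block 224
print(session)
```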
In block 232, the master auxiliary application TSR component 152 accesses the display store of hidden window object 126 and outputs it to the slave device 6, where it is displayed by the slave device in block 236. The various processes involved in establishing a display on multiple devices can occur in different sequences. In some aspects, the auxiliary application may not search for slave devices until after the user has defined the display parts in block 214. Process 200 can also be used to display, on the slave device, parts of display images of various applications generated on the master device. In such implementations, the master device can have two or more applications (or multiple web page instances) running and displayed and, in block 208, it can receive user inputs that define parts of the display images of the various applications. In block 228, the window manager 120 of the master device 5 can establish a hidden window 126 for the various applications. Alternatively, the selection of the parts of the images to be transferred to the slave device in block 208 can be performed automatically by the application that generates the image instead of being done by the user. In this regard, the image-generating application can be configured to receive characteristics of a computing device monitor, including the characteristics of a slave device monitor, and to determine an appropriate layout and display content based on those characteristics, which the application uses to define the parts of the display to be transferred to the slave device. The application can identify the defined image parts to the master auxiliary application so that it can perform the other operations described here. Different aspects can allow users to define the desired application parts using a mouse or other pointing device to select rectangular regions. Figure 9 shows a user interface gesture suitable for use on computing devices configured with a touchscreen user interface. In this regard, the user can define a desired application part by placing a finger 80 at a pre-defined location on the touchscreen, such as the lower left corner, and using two movements with a second finger 82 to define a rectangular region: a horizontal movement to define the left and right coordinates and a vertical movement to define the top and bottom coordinates. The aspects described above with reference to Figures 5-8 involve implementations in which the master device 5 creates the display parts and outputs these parts to the slave device 6 for processing. A process 300 for performing such a transfer of a display from a master device to a slave device is shown in Figure 10. In block 302 of process 300, the target application instance 134 can paint on a hidden window object 126. In block 306, the master auxiliary application 150 can retrieve the contents of the store, transform the contents of the store so that they are suitable for display on the slave device, and send the results to the slave device in block 310. In transforming the contents of the store, the master auxiliary application 150 can resize the image contents to fit the monitor size and characteristics of the slave device 6.
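Returning to the selection gesture of Figure 9, the bounding rectangle can be derived from the second finger's two strokes roughly as sketched below; the coordinate convention and sample format are illustrative assumptions.

```python
# Illustrative sketch: compute a selection rectangle from the Figure 9 gesture.

def bounding_box_from_gesture(horizontal_stroke, vertical_stroke):
    """Each stroke is a list of (x, y) touch samples from the second finger:
    the horizontal stroke supplies the left/right edges and the vertical stroke
    supplies the top/bottom edges."""
    xs = [x for x, _ in horizontal_stroke]
    ys = [y for _, y in vertical_stroke]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    return left, top, right, bottom

# Example: a sweep across x=40..200, then a sweep down y=30..120.
print(bounding_box_from_gesture([(40, 118), (120, 119), (200, 118)],
                                [(198, 30), (199, 80), (197, 120)]))
# -> (40, 30, 200, 120)
```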
Alternatively to transforming the store contents itself, the master auxiliary application 150 can communicate with the application so that, in block 302, the application paints an image into the hidden window object 126 in a size and format suitable for the slave device, so that, in block 310, the master auxiliary application 150 only needs to present the contents of the store to the slave device. As noted above, transforming the contents of the store, or guiding the application to paint an image into the hidden window object suitable for the slave device, can generate a display image that is smaller and less extensive than an image suitable for the master device, or an image that is larger and more extensive than an image suitable for the master device. In block 314, the slave auxiliary application 160 can receive a display store from the master device, and the window manager 121 of slave device 6 can display the contents in block 318. The slave window manager 121 can display the parts of the target application instance 134 in full-screen mode, where the parts use the entire monitor of the slave device (that is, the master device takes over the slave monitor). Similarly, the slave window manager 121 can display the parts in overlay mode, where the parts are alpha-blended over the other graphics applications on the slave device. In addition, the slave window manager can display the parts in "fit both" mode, where the parts are displayed next to the slave device's graphics applications. This can be done by allocating the slave auxiliary application 160 a movable window object 122. Alternatively, this can be done by allocating a fixed part of the slave monitor to the slave auxiliary application 160 and fitting the rest of the graphics applications into the remainder. Some computing devices suitable for operation as slave devices may not have the computing power available or may be unable to perform the processing required for the overlay or fit-both display modes. In some aspects, the slave device is able to send the output of its various graphics applications to the master device, so that the master device can perform the transformations. A method for producing such a display is shown in Figure 11, which shows a process 320 that can be implemented on various computing devices. In block 302 of process 320, target application instance 134 can paint in a hidden window 126, which can include a window store. As noted above, in an alternative aspect the master auxiliary application 150 can communicate with the application, so that, in block 302, the application paints an image into the hidden window object 126 in a size and format suitable for the slave device. In block 306, the master auxiliary application 150 can retrieve the contents of the store. In block 304, the slave window manager 121 can aggregate the contents of the graphics applications and store them in an aggregate store. This can be done in a manner similar to the way in which the slave window manager 121 would aggregate applications and store them in the system frame store when it is not functioning as a slave device. In block 308, the slave auxiliary application 160 can access the aggregate store and deliver its contents to the master device, where they are received by the master auxiliary application 150. In block 312, the master auxiliary application 150 can transform the contents of the window store, merge them with the aggregated slave store so that they are suitable for display on the slave device, and transmit the results to the slave device.
In block 314, slave auxiliary application 160 can receive the merged contents from master auxiliary application 150, and the contents are displayed by slave window manager 121 in block 318. In addition to displaying application parts on a slave device, some aspects may allow the user to interact with the target application from the slave device. In a GUI displayed in a typical window, graphics applications can set certain code to be executed when an input event occurs. In the poker application discussed earlier, pressing the touchscreen at a point within the box defined for a particular button can cause the poker application to send a data communication to the server indicating that the user selected the corresponding action. Different aspects can allow an input event on a slave device to execute code on the master device. In the example of the poker application, the user can touch the screen of the slave device and have the poker application that runs on the master device send a message from the master device to the server indicating the selected action. An exemplary method that provides such an interaction is shown in Figure 12, which shows a process 350 that can be implemented on various computing devices. In block 352 of process 350, the slave device can receive a user input in the form of a button press on the slave device 6. On slave devices that include a touchscreen monitor, the user input can arrive in the form of a touch event that includes the user's touch coordinates. In block 356, slave window manager 121 can receive the input signal and determine, from its status information regarding window objects 122, that the input signal belongs to the window managed by slave auxiliary application 160 (i.e., the application parts). In block 360, slave window manager 121 can generate a message to be sent to slave auxiliary application 160 indicating the type of input event (i.e., a button click) and the specific button pressed or the relative coordinates of the touch event on the touchscreen. In block 364, slave auxiliary application 160 can receive the input event from slave window manager 121 and send the input event to master device 5, where it is received by master auxiliary application 150. In block 368, the master auxiliary application 150 can receive the incoming event and determine how the received coordinates correspond to the target application 134 based on the stored information that maps the pixels in the hidden window store 126 to the user-defined application parts. In block 372, the master auxiliary application 150 can send a message to the master window manager 120 that includes the type of input event and the translated coordinates. In block 376, master window manager 120 can receive the message indicating an input event and, in response, send a message to target application 134. In block 380, target application 134 can receive the message and determine, based on the type of input event and the translated coordinates, that the user clicked on a button with a corresponding function (that is, a "click-activated" function), and then execute that function. In block 384, the target application can also paint in the hidden window (that is, provide pixel output) based on the function's execution. The various processes involved in displaying application parts on a slave device can be resource intensive. As discussed above with reference to Figure 11, the various aspects can determine how to allocate the processing load based on relative computing capabilities.
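The coordinate translation performed in block 368 can be illustrated with a short sketch. The code below is not from the patent: the class and function names (`PortionMapping`, `translate_touch`) and the rectangle convention are assumptions; it only shows how a touch on the slave monitor could be mapped back to the corresponding point of the target application's window on the master.

```python
from dataclasses import dataclass


@dataclass
class PortionMapping:
    """Records where a user-defined portion sits on the slave monitor and
    where the same pixels live in the target application's window on the
    master (both as (left, top, right, bottom) rectangles)."""
    slave_rect: tuple
    master_rect: tuple


def translate_touch(slave_xy, mapping: PortionMapping):
    """Map a touch on the slave monitor to coordinates in the target
    application on the master, or return None if the touch falls outside
    the displayed portion."""
    x, y = slave_xy
    sl, st, sr, sb = mapping.slave_rect
    ml, mt, mr, mb = mapping.master_rect
    if not (sl <= x < sr and st <= y < sb):
        return None
    # Linear interpolation between the two rectangles.
    mx = ml + (x - sl) * (mr - ml) / (sr - sl)
    my = mt + (y - st) * (mb - mt) / (sb - st)
    return int(mx), int(my)


# A touch at (90, 40) on a watch-sized screen showing the portion that was
# clipped from (300, 500)-(620, 740) of the master application's window:
m = PortionMapping(slave_rect=(0, 0, 176, 144), master_rect=(300, 500, 620, 740))
print(translate_touch((90, 40), m))
```

The translated point, together with the event type, is what the master window manager would forward to the target application so that the corresponding click-activated function executes.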
Some aspects may allow a proxy device to render the application parts and/or combine the application parts with the output from the slave device. For example, the user may wish to display a video on a safety-glasses-type computing device, where the video is actually playing on a mobile device (that is, the video player is accessing the video file on the device's storage and decoding the video using the mobile device's CPU). The mobile device may or may not be able to decode the video and manage the glasses display at the same time, but the user may wish to offload the rendering of the application parts onto a nearby device in order to save battery power or reserve processing power for other applications on the mobile device. This can be achieved with an aspect of the present invention under which some part of the processing is performed by a proxy device in communication with the master and slave devices. An example of the various software components that can be implemented in computing devices in such a configuration is shown in Figure 13. As described above, master device 5 can implement a master window manager 120 with a hidden window object 126 that corresponds to a target application instance 134. Master device 5 can also implement a master helper application 150 for communication with slave devices 6 and proxy devices 7 (a nearby laptop computer, for example) through a connection to the personal area network 109. There may be a slave device 6 that includes a slave window manager 121 with a window object 122 that corresponds to a slave auxiliary application 160. The slave auxiliary application 160 can communicate with master devices 5 and proxy devices 7 through a connection to a personal area network 109, such as a Bluetooth® network. There may also be a proxy device 7 that includes a proxy auxiliary application 155 for communicating with master devices 5 and slave devices 6 via a connection to a personal area network 109. An exemplary method for presenting a display on multiple devices is shown in Figure 14, which shows a process 390 that can be implemented in various computing devices. In block 302 of process 390, target application instance 134 can paint in a hidden window object 126, which can include a window store. In block 306, the master auxiliary application 150 can retrieve the contents of the store and deliver its contents to the proxy auxiliary application 155. As noted above, in an alternative aspect, the master auxiliary application 150 can communicate with the application so that, in block 302, the application paints an image on the hidden window object 126 in a size and format suitable for the slave device. This may include directing the application to paint an image that can be easily aggregated with the content of the slave device. Using information provided by the master helper application, an application can paint an image that is larger or smaller than is suitable for display on the master device. In block 304, the slave window manager 121 can aggregate the contents of the graphics applications and store them in an aggregate store. In block 308, slave auxiliary application 160 can access the aggregate store and deliver its contents to proxy auxiliary application 155. In block 312, proxy auxiliary application 155 can perform processes for mapping the contents of the hidden window store 126 to the display parts and fitting the display parts within the output of other applications on the slave device 6.
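The proxy's share of the work in block 312 can be illustrated with a small sketch. The code below is not part of the disclosure; `resize_nearest` and `proxy_compose` are hypothetical names, nearest-neighbour resizing stands in for whatever scaler a proxy device would actually use, and frames are again assumed to be row-major lists of RGB tuples.

```python
def resize_nearest(frame, out_w, out_h):
    """Nearest-neighbour resize of a row-major frame of RGB tuples; a
    cheap stand-in for the proxy's real scaling routine."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]


def proxy_compose(master_portion, slave_aggregate, dest_rect):
    """One pass of the proxy's step: fit the portion received from the
    master into dest_rect of the aggregate frame received from the slave,
    then return the merged frame to be sent back to the slave."""
    left, top, right, bottom = dest_rect
    scaled = resize_nearest(master_portion, right - left, bottom - top)
    merged = [row[:] for row in slave_aggregate]
    for y, row in enumerate(scaled):
        for x, px in enumerate(row):
            merged[top + y][left + x] = px
    return merged


portion = [[(255, 255, 255)] * 2 for _ in range(2)]   # portion painted by the target application
aggregate = [[(0, 0, 0)] * 6 for _ in range(4)]       # aggregate frame received from the slave
merged = proxy_compose(portion, aggregate, dest_rect=(2, 1, 6, 3))
print(merged[1][2], merged[0][0])
```

The point of the design is that neither the master nor the slave needs the spare CPU cycles for this resizing and merging; the proxy returns a frame the slave can display directly.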
In block 314, slave auxiliary application 160 can receive a display store from the master device, and window manager 121 of the slave device 6 can display the contents in block 318. In another application of the various aspects, a slave device 6 can be configured to relay display images to a second slave device. Figure 15 shows a diagram of software components of three computing devices 5, 6a, 6b that can allow such image sharing. As described above, master device 5 can implement a master window manager 120 with a hidden window object 126 that corresponds to a target application instance 134. Master device 5 can also implement a master helper application 150 for communicating with slave devices 6a, 6b through a connection to the personal area network 109. There may be a first slave device 6a that includes a slave window manager 121a with a window object 122a which corresponds to a slave auxiliary application 160a. The slave auxiliary application 160a can communicate with master devices 5 and other slave devices 6b through a connection to a personal area network 109a, such as a Bluetooth® network. In addition, the first slave device 6a can include a master auxiliary application 150a for communicating with other slave devices 6b via a connection to the personal area network 109. Similarly, a second slave device 6b can include a proxy auxiliary application 155 for communication with master devices 5 and other slave devices 6a through a connection to a personal area network 109. When slave devices 6a include both a master auxiliary application 150a and a slave auxiliary application 160a, they can function either as a master device or a slave device, or both, so that they can relay a slave display to a second slave device. Processes for relaying a display image to a second slave device 6b are similar to those described above with reference to Figures 8, 10-12 and 14, with the relay slave device 6a implementing both slave and master device processes. Using this aspect, the user can transfer a display image to his electronic wristwatch monitor and then transfer that display to his friends' electronic wristwatch monitors, so that they can share the experience. Processes 300, 320, 350 and 390 can also be used to transfer display parts from various applications or target web pages that run on the master device to a slave device. To do this, in block 302, each of the applications or web pages can be instructed to paint its display output on the hidden window object 126. Then, each of the processes 300, 320, 350 and 390 proceeds in a manner similar to that used for a single application display. The aspects described above can be implemented in any of several portable computing devices, such as cell phones, personal data assistants (PDAs), mobile web access devices and other processor-equipped devices that may be developed in the future, configured to communicate with external networks, such as over a wireless data link. Typically, such portable computing devices will have in common the components shown in Figure 16. For example, the portable computing device 5 may include a processor 401 coupled to an internal memory 402 and a monitor 403. In addition, the portable computing device 5 may have an antenna 404 for sending and receiving electromagnetic radiation, which is connected to a wireless data link and/or cell phone transceiver 405 coupled to the processor 401.
Portable computing devices 5 also typically include a keyboard 406 or miniature keyboard and menu selection buttons or rocker switches 407 to receive user input, as well as a speaker 409 to generate audio output. Several of the aspects described above can also be implemented with any of several computing devices, such as a laptop computer 7 shown in Figure 17. Such a laptop computer 7 typically includes a housing 466 that contains a processor 461 coupled to a volatile memory 462 and to a large capacity non-volatile memory, such as a disk drive 463. Computer 7 may also include a floppy disk drive 464 and a compact disc (CD) drive 465 coupled to processor 461. The computer housing 466 also typically includes a touchpad 467, a keyboard 468 and a monitor 469. Various aspects described above can be implemented with any of several computing devices, such as the wrist computer 6 shown in Figure 18. Such a wrist computer 6 typically includes a housing 486, which contains a processor 481 coupled to a volatile memory 482 and a large capacity non-volatile memory, such as a solid state drive 483. The computer housing 486 also typically includes a number of buttons 488 and a touchscreen monitor 489. The processor 401, 461, 481 can be any programmable microprocessor, microcomputer or multi-processor chip or chips that can be configured by software instructions (applications) to perform various functions, including the functions of the various aspects described above. On some computing devices, several processors 401, 461, 481 may be provided, such as a processor dedicated to managing data communications and a processor dedicated to running other applications. The various aspects can be implemented by a computer processor 401, 461, 481 that executes software instructions configured to implement one or more of the methods or processes described. Such software instructions can be stored in memory 402, 462, 482, in hard disk memory 463, in a tangible storage medium or on servers accessible over a network (not shown) as separate applications, or as compiled software that implements a method or process. In addition, software instructions can be stored in any form of tangible processor-readable memory, including: a random access memory 402, 462, 482, a hard disk memory 463, a floppy disk (readable on a floppy disk drive 464), a compact disc (readable on a CD drive 465), an electrically erasable programmable read-only memory (EEPROM) 483, a read-only memory (such as FLASH memory) and/or a memory module (not shown) connected to computing device 5, 6, 7, such as an external memory chip or an external memory connectable via USB (a "flash drive," for example) connected to a USB port. The preceding method descriptions and process flow diagrams are presented as illustrative examples only and are not intended to require or imply that the processes of the various aspects must be performed in the order presented. As will be understood by those skilled in the art, the blocks and processes in the preceding aspects can be performed in any order. Words such as "next," "then," and "thereafter" are not intended to limit the order of the processes; these words are simply used to guide the reader through the description of the methods. In addition, any reference to claim elements in the singular, using the articles "a," "an" or "the," for example, should not be interpreted as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits and algorithm processes described in connection with the aspects disclosed here can be implemented as electronic hardware, computer software or combinations of both. To clearly illustrate this interchangeability of hardware and software, various components, blocks, modules, circuits and algorithms have been described above generically in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the specific application and the design constraints imposed on the system as a whole. Those skilled in the art can implement the described functionality in different ways for each specific application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. The hardware used to implement the various illustrative logic blocks, modules and circuits described in connection with the aspects disclosed here can be implemented or executed with a general purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components or any combination thereof designed to perform the functions described here. A general purpose processor can be a microprocessor, but, alternatively, the processor can be any processor, controller, microcontroller or state machine. A processor can also be implemented as a combination of computing devices, such as, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors together with a DSP core, or any other such configuration. Alternatively, some processes or methods can be performed by circuitry that is specific to a given function. In one or more exemplary aspects, the functions described can be implemented in hardware, software, firmware or any combination thereof. If implemented in software, the functions can be stored on or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates the transfer of a computer program from one place to another. A storage medium can be any available medium that can be accessed by a general purpose or special purpose computer. By way of example, and not by way of limitation, such a computer-readable medium may comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general purpose or special purpose computer. In addition, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL) or wireless technologies such as infrared, radio and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL or wireless technologies such as infrared, radio and microwave are included in the definition of medium.
The terms disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. In addition, the operations of a method or algorithm can reside as one or any combination or set of codes and/or instructions stored on a machine-readable medium and/or a computer-readable medium, which can be incorporated into a computer program product. The foregoing description of the disclosed embodiments is presented to enable anyone skilled in the art to make or use the present invention. Various changes to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein can be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown here, but is to be accorded the widest scope consistent with the principles and novel features disclosed here.
Claims:
Claims (15) [1] 1. Method for displaying one or more selected parts (31, 32, 33, 58) of a display image generated and displayed on a first computing device (5) that implements a master auxiliary application (150) on a monitor of a second computing device (6, 7, 8) that implements a slave auxiliary application (160), characterized by the fact that it comprises: reformatting a display image generated by an application (132) that runs on the first computing device (5) in order to fit the monitor of the second computing device (6, 7, 8) and store the reformatted display image in a frame store of the first computing device (5) as a hidden window object (126) under the guidance of the master auxiliary application (150), the master auxiliary function (150) allowing a user to select one or more parts (31, 32, 33, 58) of the display image displayed on the first computing device (5) to share on the second computing device (6, 7, 8); transmitting the display data of the hidden window object (126) to the second computing device (6, 7, 8) via communication between the master auxiliary application (150) and the slave auxiliary application (160); storing the display data of the hidden window object (126) in a frame store of the second computing device (6, 7, 8) under the guidance of the slave auxiliary application (160); and rendering the display on the second computing device (6, 7, 8) using the display data of the hidden window object (126) stored in the frame store of the second computing device (6, 7, 8). [2] 2. Method, according to claim 1, characterized by the fact that: reformatting a display image to fit the monitor of the second computing device (6, 7, 8) and storing the reformatted display image in a frame store of the first computing device (5) as a hidden window object (126) under the guidance of the master auxiliary application (150) comprises: orienting an application that runs on the first computing device (5) to paint a part of the display image from the application to the frame store of the first computing device (5) as a hidden window object (126); and reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8); and in particular: the step of reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) is performed on the first computing device (5) under the guidance of the master auxiliary application (150); and the step of transmitting display data from the hidden window object (126) to the second computing device (6, 7, 8) comprises transmitting display data from the resized hidden window object (126) to the second computing device (6, 7, 8); or the step of reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) is performed on the second computing device (6, 7, 8) under the guidance of the slave auxiliary application (160); and the step of transmitting the display data of the hidden window object (126) to the second computing device (6, 7, 8) comprises transmitting the originally dimensioned display data of the hidden window object to the second computing device (6, 7, 8); or transmitting the display data of the hidden window object (126) to a third computing device, wherein: the step of reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) is performed on the third computing device; and the step of transmitting display data from the hidden
window object (126) to the second computing device (6, 7, 8) comprises transmitting display data from the resized hidden window object (126) from the third computing device to the second computing device (6, 7, 8); or reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) under the guidance of the master auxiliary application (150) comprises processing the display data of the hidden window object (126) so that the data generates a display image compatible with the monitor of the second computing device (6, 7, 8); or receiving display data from the second computing device (6, 7, 8), wherein reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) under the guidance of the master auxiliary application (150) comprises generating a blend of the display data of the hidden window object (126) and the received display data of the second computing device (6, 7, 8) to generate a single merged display image compatible with the monitor of the second computing device (6, 7, 8); or receiving display data from the second computing device (6, 7, 8), wherein reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device (6, 7, 8) under the guidance of the master auxiliary application (150) comprises generating a single display image compatible with the monitor of the second computing device (6, 7, 8) that presents the display data of the hidden window object (126) side by side with the received display data of the second computing device (6, 7, 8); or transmitting the display data of the hidden window object (126) to the second computing device (6, 7, 8) comprises transmitting the display data of the hidden window object (126) to the second computing device (6, 7, 8) through a wireless data link established between the first and second computing devices (6, 7, 8), where the wireless data link is a Bluetooth® wireless data link. [3] 3. Method according to claim 1, characterized by the fact that it additionally comprises: notifying the second computing device (6, 7, 8) that parts (31, 32, 33, 58) of a display image can be transmitted to it; showing a notice to a user of the second computing device (6, 7, 8) to confirm agreement to receive the part of the display image; determining whether the user of the second computing device (6, 7, 8) confirmed agreement to receive the part of the display image; and receiving display data from the hidden window object (126) on the second computing device (6, 7, 8) if it is determined that the user of the second computing device (6, 7, 8) has confirmed agreement to receive the part (31, 32, 33, 58) of the display image. [4] 4.
Method, according to claim 1, characterized by the fact that reformatting a display image generated by an application that runs on the first computing device (5) to fit the monitor of the second computing device (6, 7, 8) and storing the reformatted display image in a frame store of the first computing device (5) as a hidden window object (126) under the guidance of the master auxiliary application (150) comprises: providing characteristics of the monitor of the second computing device to the application that runs on the first computing device (5); and receiving a display image of the application in the frame store of the first computing device (5) in a format compatible with the monitor of the second computing device (6, 7, 8); in particular, in which the display image received from the application is scaled to a monitor that is larger than a monitor of the first computing device (5). [5] 5. Method according to claim 1, characterized by the fact that it further comprises: transmitting the display data of the hidden window object (126) from the second computing device (6, 7, 8) to a third computing device; storing the received display data of the hidden window object (126) in a frame store of the third computing device; and rendering a display on the third computing device using the display data of the hidden window object (126) stored in the frame store of the third computing device. [6] 6. Method for sharing one or more selected parts (31, 32, 33, 58) of a display image generated and displayed on a first computing device (5) that implements a master auxiliary application (150) on a monitor of a second computing device (6, 7, 8) that implements a slave auxiliary application (160), characterized by the fact that it comprises: reformatting a display image generated by an application that runs on the first computing device (5) in order to fit on the monitor of the second computing device (6, 7, 8) and store the reformatted display image in a frame store on the first computing device (5) as a hidden window object (126) under the guidance of the master auxiliary application (150), the master auxiliary function (150) allowing a user to select one or more parts (31, 32, 33, 58) of the display image displayed on the first computing device (5) to share on the second computing device (6, 7, 8); and transmitting the display data of the hidden window object (126) to the second computing device (6, 7, 8). [7] 7. Computing device (5) adapted to display a display image, characterized by the fact that it comprises: mechanisms to reformat the display image generated by an application that runs on the computing device (5) to fit on a monitor of a second computing device (6, 7, 8); mechanisms for storing the reformatted display image in a frame store as a hidden window object (126); and mechanisms for transmitting display data from the hidden window object (126) to the second computing device (6, 7, 8) via a transceiver (405), in which the computing device (5) is adapted to provide a master auxiliary function (150) allowing a user to select one or more parts (31, 32, 33, 58) of the display image displayed on the computing device (5) to share on the second computing device (6, 7, 8). [8] 8.
Computing device (5), according to claim 7, characterized by the fact that the mechanisms for reformatting a display image generated by an application that runs on the computing device (5) comprise: mechanisms to guide an application that runs on the processor to paint a part of the application's display image in the frame store as a hidden window object (126); and mechanisms for reformatting the display data of the hidden window object (126) to fit the monitor of the second computing device. [9] 9. Computing device (5) according to claim 7, characterized by the fact that: the mechanisms for transmitting the display data from the hidden window object (126) to the second computing device (6, 7, 8) comprise mechanisms for transmitting display data from the reformatted hidden window object (126) to the second computing device (6, 7, 8); and in particular: the mechanisms for transmitting display data from the hidden window object (126) to the second computing device (6, 7, 8) comprise mechanisms for transmitting the display data from the originally dimensioned hidden window object (126) to the second computing device (6, 7, 8); or further comprising: mechanisms for receiving display data from the second computing device (6, 7, 8), wherein the mechanisms for reformatting the display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8) comprise mechanisms for generating a blend of the display data of the hidden window object (126) and the received display data of the second computing device (6, 7, 8) in order to generate a single merged display image compatible with the monitor of the second computing device (6, 7, 8); or mechanisms for receiving display data from the second computing device (6, 7, 8), wherein the mechanisms for reformatting the display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8) comprise mechanisms to generate a single display image compatible with the monitor of the second computing device (6, 7, 8) that presents the display data of the hidden window object (126) side by side with the received display data of the second computing device (6, 7, 8). [10] 10. Computing device (5) according to claim 7, characterized by the fact that it additionally comprises mechanisms for notifying the second computing device (6, 7, 8) of which parts (31, 32, 33, 58) of a display image can be transmitted to it; or wherein the mechanisms to reformat a display image generated by an application that runs on the computing device (5) to fit on a monitor of the second computing device (6, 7, 8) comprise: mechanisms for providing characteristics of the monitor of the second computing device (6, 7, 8) to the application that runs on the computing device (5); and mechanisms to receive a display image of the application in the frame store in a format compatible with the monitor of the second computing device (6, 7, 8). [11] 11.
Communication system, characterized by the fact that it comprises: a first communication device (5) adapted to display a display image; and a second communication device (6, 7, 8), in which the first communication device (5) comprises: mechanisms for storing the display image generated by an application that runs on the first communication device (5) in a first frame store as a hidden window object (126); and mechanisms for transmitting display data from the hidden window object (126) to the second communication device (6, 7, 8), and in which the first communication device (5) is adapted to provide a master auxiliary function (150) allowing a user to select one or more parts (31, 32, 33, 58) of the display image displayed on the first communication device (5) to share on the second communication device (6, 7, 8); and in which the second communication device (6, 7, 8) comprises: mechanisms for receiving display data of the hidden window object (126) from the first communication device (5); mechanisms for storing the display data of the hidden window object (126); and mechanisms for rendering an image using the display data of the hidden window object (126). [12] 12. Communication system, according to claim 11, characterized by the fact that: the first computing device (5) additionally comprises: mechanisms to guide an application that runs on the first computing device (5) to paint a part of the display image of the application in a frame store as a hidden window object (126); and mechanisms for reformatting the display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8); or mechanisms to reformat the display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8); and mechanisms for transmitting display data from the hidden window object (126) to the second computing device (6, 7, 8) comprise mechanisms for transmitting display data from the hidden window object (126) to the second computing device (6, 7, 8); or the second communication device is configured with processor-executable instructions to execute processes that comprise: reformatting the display data of the hidden window object (126) to fit the second monitor; or further comprising a third computing device, comprising: mechanisms for receiving display data from the hidden window object (126) of the first computing device; mechanisms for reformatting the received display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8); and mechanisms for transmitting the display data of the reformatted hidden window object (126) to the second computing device (6, 7, 8), where: the mechanisms for transmitting, from the first computing device (5), the display data of the hidden window object (126) to the second computing device (6, 7, 8) comprise mechanisms for transmitting the display data of the hidden window object (126) to the third computing device for processing; and the mechanisms for receiving, from the second computing device (6, 7, 8), the display data of the hidden window object (126) of the first computing device (5) comprise mechanisms for receiving the display data of the hidden window object (126) via the third computing device; or the first computing device further comprising: mechanisms for receiving user input that indicates a selection of the display image to be displayed on the second computing device (6, 7, 8); mechanisms to guide an application that runs on the first processor to paint the
selected part of the application's display image in a frame store as a hidden window object (126); and mechanisms for reformatting the display data of the hidden window object (126) to fit on a monitor of the second computing device (6, 7, 8); or the second computing device (6, 7, 8) further comprising: mechanisms for receiving user input; and mechanisms for communicating information regarding the received user input to the first computing device (5); and the first computing device (5) additionally comprising: mechanisms for receiving the information regarding the received user input; mechanisms to correlate the received information regarding the user input with the display image portion of the application in order to determine a user input that corresponds to the application operating on the first computing device (5); and mechanisms to communicate the corresponding input to the application operating on the first computing device. [13] 13. Communication system according to claim 11, characterized by the fact that: the first computing device (5) additionally comprises mechanisms for notifying the second computing device (6, 7, 8) of which parts (31, 32, 33, 58) of a display image can be transmitted to it, and in which the second computing device (6, 7, 8) further comprises: mechanisms for showing a warning to a user of the second computing device (6, 7, 8) to confirm agreement to receive the display image portion; mechanisms for receiving a user input; mechanisms for determining whether the received user input confirmed agreement to receive the display image portion; and mechanisms for accepting the display data of the hidden window object (126) if it is determined that the user input has confirmed agreement to receive the part of the display image, in particular: mechanisms for transmitting an alert to the first computing device (5) indicating which parts (31, 32, 33, 58) of a display image will be accepted if it is determined that the user input has confirmed agreement to receive the display image part. [14] 14. Communication system according to claim 11, characterized by the fact that: the first computing device (5) additionally comprises: mechanisms for providing characteristics of a monitor of the second computing device (6, 7, 8) to the application that runs on the first computing device (5); and mechanisms to receive a display image of the application in a frame store in a format compatible with the monitor of the second computing device (6, 7, 8); or comprises a fourth computing device, wherein the second computing device (6, 7, 8) further comprises mechanisms for transmitting display data from the hidden window object (126) to the fourth computing device, and wherein the fourth computing device comprises: mechanisms for receiving display data from the hidden window object (126) from the second computing device (6, 7, 8); mechanisms for storing the received display data of the hidden window object (126); and mechanisms for rendering a display using the display data of the hidden window object (126). [15] 15. Computer-readable storage medium, characterized by the fact that it comprises codes for carrying out the method steps as defined in claims 1-6, when executed on a computer.
|